Processing Images from Multiple IACTs in the TAIGA Experiment with Convolutional Neural Networks

Polyakov, Stanislav, Demichev, Andrey, Kryukov, Alexander, Postnikov, Evgeny

arXiv.org Artificial Intelligence

An extensive air shower caused by a high-energy particle (a cosmic ray or gamma ray) interacting with the upper atmosphere can be detected by several methods, including imaging atmospheric Cherenkov telescopes (IACTs). In the Russian TAIGA (Tunka Advanced Instrument for cosmic ray physics and Gamma-ray Astronomy) experiment, the number of installed and commissioned IACTs was increased from one to two in 2020, and the third telescope was installed in 2020 [1]. Convolutional neural networks (CNNs) are a very successful machine learning tool. Several research teams have demonstrated high performance of CNNs in the analysis of images from IACTs and IACT arrays in several gamma-ray astronomy experiments, such as VERITAS [2], CTA [3], and H.E.S.S. [4]. We previously applied CNNs to the analysis of images from a single TAIGA IACT, specifically to the problems of identifying the event types and estimating the energy of the original gamma rays [5, 6]. In this paper, we apply convolutional neural networks to the identification of the event types and the estimation of the energy of the original gamma rays based on images from one or two TAIGA Cherenkov telescopes, and we compare the neural network performance in monoscopic and stereoscopic modes.
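To make the CNN-based approach concrete, the sketch below shows the two core building blocks such networks apply to a telescope camera image: a 2-D convolution followed by 2x2 average pooling. This is a minimal illustrative example, not the paper's actual architecture; the 8x8 square image, the averaging kernel, and the ReLU nonlinearity are all assumptions for demonstration (real TAIGA IACT cameras have hexagonal pixel layouts that must first be mapped onto a regular grid).

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with a kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def avgpool2x2(x):
    """2x2 average pooling with stride 2 (input dimensions assumed even)."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Toy 8x8 "camera image" standing in for an IACT image.
img = np.arange(64, dtype=float).reshape(8, 8)

# One convolution (3x3 averaging kernel) + ReLU, then 2x2 average pooling:
# 8x8 -> 6x6 feature map -> 3x3 pooled map.
feat = np.maximum(conv2d(img, np.ones((3, 3)) / 9.0), 0.0)
pooled = avgpool2x2(feat)
```

In a full network, several such convolution/pooling stages feed fully connected layers that output either a class score (gamma vs. hadron event) or a regression value (gamma-ray energy); in stereoscopic mode, the feature maps from the two telescope images are combined before those final layers.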